Web Survey Bibliography
All social surveys suffer from different types of error, of which one of the most studied is non-response bias. Non-response bias is a systematic error that arises because individuals differ in their accessibility and in their propensity to participate in a survey, according to their own characteristics as well as those of the survey itself. The extent of the problem depends heavily on the correlation between the response mechanism and key survey variables. Non-response bias is nevertheless difficult to measure or correct for, because relevant data are lacking for the whole target population or sample. In this paper, non-response follow-up surveys are considered as a possible source of information about non-respondents. Non-response follow-ups, however, suffer from two methodological issues: they themselves operate through a response mechanism that can introduce non-response bias of its own, and they pose a problem of comparability of measurement, mostly because the survey design differs between the main survey and the non-response follow-up. To detect possible bias, the survey variables included in non-response surveys must be related to the mechanism of participation, yet not be sensitive to measurement effects arising from the different designs. Drawing on the accumulated experience of four similar non-response follow-ups, we studied the survey variables that fulfill these conditions. We distinguished socio-demographic variables, which are measurement-invariant but have a lower correlation with non-response, from attitudinal variables, such as trust, social participation, or integration in the public sphere, which are more sensitive to measurement effects but potentially more appropriate for accounting for the non-response mechanism. Our results show that education level, work status, and living alone, as well as political interest, satisfaction with democracy, and trust in institutions, are pertinent variables to include in non-response follow-ups of general social surveys.
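The mechanism the abstract describes, bias driven by the correlation between response propensity and a key survey variable, can be illustrated with a small simulation. This is a hypothetical sketch, not taken from the paper; the variable, the logistic propensity model, and its coefficients are invented for illustration only:

```python
import math
import random

random.seed(42)

# Simulate a population where the propensity to respond is positively
# correlated with a key survey variable y (e.g. a standardised trust score).
N = 100_000
population = []
for _ in range(N):
    y = random.gauss(0, 1)                       # key survey variable
    # Invented logistic propensity: more "trusting" people respond more often.
    p = 1 / (1 + math.exp(-(-0.5 + 0.8 * y)))
    responded = random.random() < p
    population.append((y, responded))

true_mean = sum(y for y, _ in population) / N
respondents = [y for y, r in population if r]
resp_mean = sum(respondents) / len(respondents)

print(f"true mean:                  {true_mean:+.3f}")
print(f"respondent mean:            {resp_mean:+.3f}")
print(f"non-response bias estimate: {resp_mean - true_mean:+.3f}")
```

Because respondents over-represent high values of y, the respondent mean overshoots the true population mean; when the propensity coefficient on y is set to zero, the bias vanishes, which is exactly why the paper looks for follow-up variables correlated with the participation mechanism.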
- See more at: https://ojs.ub.uni-konstanz.de/srm/article/view/6138
Web survey bibliography - Other (439)
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Comparing online and telephone survey results in the context of a skin cancer prevention campaign evaluation...; 2016; Hollier, L.P.; Pettigrew, S.; Slevin, T.; Strickland, M.; Minto, C.
- Sample Representation and Substantive Outcomes Using Web With and Without Incentives Compared to Telephone...; 2016; Lipps, O.; Pekari, N.
- The Dynamic Identity Fusion Index: A New Continuous Measure of Identity Fusion for Web-Based Questionnaires...; 2016; Jimenez, J.; Gomez, A.; Buhrmester, M.; Whitehouse, H.; Swann, W. B.
- Collecting Data from mHealth Users via SMS Surveys: A Case Study in Kenya; 2016; Johnson, D.
- When Should I Call You? An Analysis of Differences in Demographics and Responses According to Respondents...; 2016; Vicente, P.; Lopes, I.
- The use and positioning of clarification features in web surveys; 2016; Metzler, A.; Kunz, T.; Fuchs, M.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Internet Abusive Use Questionnaire: Psychometric properties; 2016; Calvo-Frances, F.
- The impact of academic sponsorship on Web survey dropout and item non-response; 2016; Allen, P. J.; Roberts, L. D.
- A Statistical Approach to Provide Individualized Privacy for Surveys; 2016; Esponda, F.; Huerta, K.; Guerrero, V. M.
- Quality of Different Scales in an Online Survey in Mexico and Colombia; 2016; Revilla, M.; Ochoa, C.
- Presentation matters: how mode effects in item non-response depend on the presentation of response options...; 2016; Zeglovits, E.; Schwarzer, S.
- Exploring Factors in Contributing Student Progress in the Open University; 2016; Arifin, M. H.
- Taking MARS Digital; 2015; Melton, E.; Krahn, J.
- A Comparison of the Effects of Face-to-Face and Online Deliberation on Young Students’ Attitudes...; 2015; Triantafillidou, A.; Yannas, P.; Lappas, G.; Kleftodimos, A.
- Doing Online Surveys: Zum Einsatz in der sozialwissenschaftlichen Raumforschung [On their use in social-scientific spatial research]; 2015; Nadler, R.; Petzold, K.; Schoenduwe, R.
- Using Mobile Phones for High-Frequency Data Collection; 2015; Azevedo, J. P.; Ballivian, A.; Durbin, W.
- Willingness of Online Access Panel Members to Participate in Smartphone Application-Based Research; 2015; Pinter, R.
- Who Has Access to Mobile Devices in an Online Opt-in Panel? An Analysis of Potential Respondents for...; 2015; Revilla, M.; Toninelli, D.; Ochoa, C.; Loewe, G.
- Cell Phone and Face-to-face Interview Responses in Population-based Surveys - How Do They Compare?; 2015; Ghandour, L.; Ghandour, B.; Mahfoud, Z.; Mokdad, A.; Sibai, A. M.
- Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study; 2015; Arn, B.; Klug, S.; Kolodziejski, J.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Validation of the new scale for measuring behaviors of Facebook users: Psycho-Social Aspects of Facebook...; 2015; Bodroza, B.; Jovanovic, T.
- Participation rates, response bias and response behaviours in the community survey of the Swiss Spinal...; 2015; Fekete, C.; Segerer, W.; Gemperli, A.; Brinkhof, M.W.G.
- Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities; 2015; Lee, C.-K.; Back, K.-J.; Williams, Ro. J.; Ahn, S.-S.
- Hidden Populations, Online Purposive Sampling, and External Validity: Taking off the Blindfold; 2015; Barrat, M. J.; Ferris, J. A.; Lenton, S.
- Impact of raising awareness of respondents on the measurement quality in a web survey; 2015; Revilla, M.
- Open narrative questions in PC and smartphones: is the device playing a role?; 2015; Revilla, M.; Ochoa, C.
- Can we augment web responses with telephonic responses to a graduate destination survey?; 2015; du Toit, J.
- Comparison of Internet and interview survey modes when estimating willingness to pay using choice experiments...; 2015; Mjelde, J. W.; Kim, T. K.; Lee, C.-K.
- Suggestions for international research using electronic surveys; 2015; e Silva, S. C.; Duarte, P.
- Effects of Forced Responses and Question Display Styles on Web Survey Response Rates; 2015; Tangmanee, C.; Niruttinanon, P.
- Using Internet to Recruit Immigrants with Language and Culture Barriers for Tobacco and Alcohol Use...; 2015; Carlini, B. H.; Safioti, L.; Rue, T. C.; Miles, L.
- Recruiting Online: Lessons From a Longitudinal Survey of Contraception and Pregnancy Intentions of Young...; 2015; Harris, M. L.; Loxton, D.; Wigginton, B.; Lucke, J. C.
- Recruiting for addiction research via Facebook; 2015; Thornton, L. K.; Harris, K.; Baker, A.; Johnson, M.; Kay-Lambkin, F. J.
- Can a non-probabilistic online panel achieve question quality similar to that of the European Social...; 2015; Revilla, M.; Saris, W. E.; Loewe, G.; Ochoa, C.
- The quality of responses to grid questions as used in Web questionnaires (compared with paper questionnaires...; 2015; Dominguez, J. A.; de Rada, V. D.
- What are the Links in a Web Survey Among Response Time, Quality, and Auto-Evaluation of the Efforts...; 2015; Revilla, M.; Ochoa, C.
- Impact of mixed modes on measurement errors and estimates of change in panel data; 2015; Cernat, A.
- Probabilistic Web Survey Methodology in Education Centers: An Example in Spanish Schools; 2015; Tapia, J. A.; Menendez, J. A.
- Offline recruiting of young people for an online survey - what affects response rates; 2015; Zeglovits, E.
- Smartphones @work; 2015; Bittman, M.
- Comparison of different mixed-mode and face-to-face surveys - response rates and costs; 2015; Ainsaar, M.; Hendrikson, R.
- Online Eye-Tracking of Dynamic Advertising Content in (Mobile) Web-Surveys; 2015; Berger, S.
- Predicting Response Times in Web Surveys; 2015; Wenz, A.